Direct conditional probability density estimation with sparse feature selection
Similar resources
Conditional Density Estimation with Class Probability Estimators
Many regression schemes deliver a point estimate only, but often it is useful or even essential to quantify the uncertainty inherent in a prediction. If a conditional density estimate is available, then prediction intervals can be derived from it. In this paper we compare three techniques for computing conditional density estimates using a class probability estimator, where this estimator is ap...
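As a rough illustration of the general idea (a hypothetical sketch, not the paper's specific techniques): bin the continuous target into classes, estimate class probabilities conditional on the input, and divide by the bin width to recover a conditional density. Here the "class probability estimator" is just an empirical count table over a binned input.

```python
import random

# Hypothetical sketch: conditional density estimation via class probabilities.
# The target y is binned into classes; the class probability estimator is a
# simple empirical count table conditioned on a binned input x.
random.seed(0)
xs = [random.random() for _ in range(5000)]
data = [(x, 2.0 * x + random.gauss(0, 0.3)) for x in xs]

N_YBINS, N_XBINS = 10, 5
y_min = min(y for _, y in data)
y_max = max(y for _, y in data)
y_width = (y_max - y_min) / N_YBINS

def ybin(y):
    return min(int((y - y_min) / y_width), N_YBINS - 1)

def xbin(x):
    return min(int(x * N_XBINS), N_XBINS - 1)

counts = [[0] * N_YBINS for _ in range(N_XBINS)]
for x, y in data:
    counts[xbin(x)][ybin(y)] += 1

def cond_density(y, x):
    """p(y | x): class probability of y's bin, divided by the bin width."""
    row = counts[xbin(x)]
    total = sum(row)
    return row[ybin(y)] / total / y_width if total else 0.0

# By construction the estimate integrates to one over y for any fixed x.
mass = sum(cond_density(y_min + (i + 0.5) * y_width, 0.5) * y_width
           for i in range(N_YBINS))
```

A prediction interval then falls out directly: accumulate y-bins around the mode of `cond_density(·, x)` until the desired coverage is reached.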
Conditional Probability Estimation
This paper studies in particular an aspect of the estimation of conditional probability distributions by maximum likelihood that seems to have been overlooked in the literature on Bayesian networks: The information conveyed by the conditioning event should be included in the likelihood function as well.
Using zero-norm constraint for sparse probability density function estimation
A new sparse kernel probability density function (pdf) estimator based on a zero-norm constraint is constructed using the classical Parzen window (PW) estimate as the target function. The so-called zero-norm of the parameters is used in order to achieve enhanced model sparsity, and it is suggested to minimize an approximate function of the zero-norm. It is shown that under certain conditions, the ...
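To illustrate the goal of sparse kernel density estimation with the Parzen window as the target, the hypothetical sketch below substitutes plain greedy forward selection of a few kernels for the paper's zero-norm approximation; the point is only that a handful of kernels can track the full Parzen estimate.

```python
import math
import random

# Hypothetical sketch: approximate a 200-kernel Parzen-window density with
# only 5 kernels, chosen greedily by squared error on a grid. (The paper
# minimizes an approximate zero-norm instead; greedy selection is just an
# easy stand-in for sparsity.)
random.seed(1)
samples = [random.gauss(0.0, 1.0) for _ in range(200)]
H = 0.3  # kernel bandwidth

def gauss(u, c, h=H):
    return math.exp(-0.5 * ((u - c) / h) ** 2) / (h * math.sqrt(2 * math.pi))

def parzen(u):
    """Classical Parzen-window estimate: all kernels, equal weights."""
    return sum(gauss(u, s) for s in samples) / len(samples)

grid = [i / 10.0 - 4.0 for i in range(81)]
target = [parzen(u) for u in grid]

chosen = []
for _ in range(5):
    best, best_err = None, float("inf")
    for i in range(len(samples)):
        if i in chosen:
            continue
        trial = chosen + [i]
        approx = [sum(gauss(u, samples[j]) for j in trial) / len(trial)
                  for u in grid]
        err = sum((a - t) ** 2 for a, t in zip(approx, target))
        if err < best_err:
            best, best_err = i, err
    chosen.append(best)

sparse = [sum(gauss(u, samples[j]) for j in chosen) / len(chosen)
          for u in grid]
mass = sum(v * 0.1 for v in sparse)  # Riemann sum over the grid, close to 1
```

The sparse model keeps 5 of 200 candidate kernels yet remains a valid density; the zero-norm formulation automates exactly this trade-off between fit to the PW target and the number of surviving kernels.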
Conditional Default Probability and Density
This paper is dedicated to our friend Marek, for his birthday. Two of us have known Marek for more than 20 years, since we embarked on the adventure of Mathematics for Finance. Our paths diverged, but we always kept strong ties. Thank you, Marek, for all the fruitful discussions we have had. We hope you will find some interest in this paper and the modeling of credit risk we present, and we are look...
Binary Feature Selection with Conditional Mutual Information
In the context of classification, we propose to use conditional mutual information to select a family of binary features which are individually discriminating and weakly dependent. We show that, on a task of image classification, despite its simplicity, a naive Bayesian classifier based on features selected with this Conditional Mutual Information Maximization (CMIM) criterion performs as well as a c...
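A hypothetical sketch of the greedy CMIM rule on binary features: pick the first feature by plain mutual information with the label, then add the feature whose worst-case conditional mutual information, given any already-selected feature, is largest. Redundant features score poorly because some selected feature already explains them.

```python
import math
import random
from collections import Counter

# Hypothetical sketch of the CMIM greedy criterion on binary features.
random.seed(2)
n = 2000
y = [random.randint(0, 1) for _ in range(n)]

def noisy(bit, p):
    return bit if random.random() > p else 1 - bit

# Features 0 and 1 are noisy copies of y (informative but redundant);
# feature 2 is pure noise.
X = [[noisy(t, 0.10), noisy(t, 0.12), random.randint(0, 1)] for t in y]
cols = [[row[j] for row in X] for j in range(3)]

def cmi(a, b, c):
    """Empirical I(A; B | C) in nats for binary sequences."""
    m = len(a)
    pabc, pac = Counter(zip(a, b, c)), Counter(zip(a, c))
    pbc, pc = Counter(zip(b, c)), Counter(c)
    total = 0.0
    for (va, vb, vc), cnt in pabc.items():
        p = cnt / m
        total += p * math.log(p * (pc[vc] / m)
                              / ((pac[va, vc] / m) * (pbc[vb, vc] / m)))
    return total

def mi(a, b):
    return cmi(a, b, [0] * len(a))  # I(A; B) = I(A; B | constant)

# First pick: highest marginal mutual information with the label.
selected = [max(range(3), key=lambda j: mi(cols[j], y))]
while len(selected) < 2:
    remaining = [j for j in range(3) if j not in selected]
    # CMIM score: worst-case information given any already-selected feature.
    selected.append(max(remaining,
                        key=lambda j: min(cmi(cols[j], y, cols[k])
                                          for k in selected)))
```

Both noisy copies of the label are selected ahead of the noise feature: even conditioned on one copy, the other still carries residual information about `y`, while the noise feature's conditional mutual information is near zero.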
Journal
Journal title: Machine Learning
Year: 2015
ISSN: 0885-6125, 1573-0565
DOI: 10.1007/s10994-014-5472-x